
    Methodology for Optimization of Polymer Blends Composition

    Research on polymer blends, or alloys, has grown enormously in size and sophistication in terms of its scientific base, technology and commercial development (Paul & Bucknall, 2000). As a consequence, two important issues arise: the increased availability of new materials and the need for materials with better performance. Polymer blends are polymer systems originating from the physical mixture of two or more polymers and/or copolymers, without a high degree of chemical reaction between them. To be considered a blend, the compound should contain more than 2% by mass of the second component (Hage & Pessan, 2001; Ihm & White, 1996). The commercial introduction of entirely new polymers, however, has become increasingly difficult for several reasons. The advantage of polymer blends lies in the ability to combine existing polymers into new compositions, thereby obtaining materials with specific properties. This strategy allows savings in the research and development of new materials with equivalent properties, as well as versatility, simplicity, relatively low cost (Koning et al., 1998) and faster development of new materials (Silva, 2011). Rossini (2005) notes that, both economically and environmentally, a viable alternative to recycling pure polymers is to blend discarded materials. Mechanical recycling breaks down polymer chains, which impairs the properties of the polymers, and this degradation is directly proportional to the number of recycling cycles. Blending two or more discarded polymers can therefore be a realistic alternative, since it can yield materials with very interesting properties at low cost. Besides being inexpensive, this choice is also a sensible way to reuse waste. Post-consumption package disposal usually occurs in a disorderly manner and without regard for the environment, so recycling becomes increasingly important and necessary to remediate the environmental impact. According to Pang et al. (2000), cited in Marconcini & Ruvolo Filho (2006), polyolefins such as high-density polyethylene (HDPE), low-density polyethylene (LDPE) and polypropylene (PP), and polyesters such as poly(ethylene terephthalate) (PET), are classes of thermoplastics widely used in packaging and constitute a large part of post-consumer waste. Recycling these materials and characterizing them mechanically, anticipating a new life cycle in the form of new products, is challenging, although technologically and environmentally sound (Marconcini & Ruvolo Filho, 2006). Polymer blends can be obtained in basically two ways (Rossini, 2005): by dissolving the polymers in a good solvent common to both and subsequently letting the solvent evaporate, or in a mixer whose working temperature is high enough to melt or soften the polymeric components without degrading them. According to Wessler (2007), polymer blends may be miscible or immiscible. Miscibility is the most important property to be analyzed in a blend, given that all other system properties depend on the number of phases, their morphology and the adhesion between them. The term miscibility is directly related to solubility, i.e., a blend is miscible when the polymers dissolve mutually in each other (Silva, 2011). The immiscibility between many engineering polymers is a limiting factor for their production. Thus, compatibilizing agents are required for their production.
    Computational modeling has become increasingly popular; the main objective of such models is to assist process optimization with minimal investment of time and resources in experimental work. Most techniques fall into two main groups, physical models and statistical models, as shown by Malinov & Sha (2003). Statistical methods are chosen according to the research objectives, and several multivariate analysis methods exist, serving quite different purposes. The desired value and quality of one or more product characteristics can be obtained through experiment analysis and design of experiments (DOE). These methods help determine optimal settings of controllable process factors such as temperature, pressure, amount of reagents and operating time. Compared with trial and error, DOE also reduces the number of required tests, saving time, labor and money. An important application of DOE is the optimization of experimental formulations, for example the composition of mixtures. Formulation development is a fundamental part of the food, chemical, plastics, rubber, paint and pharmaceutical industries, among others. In materials science, it is important to understand the correlation between material processing, microstructure and properties, which enables the optimization of process parameters and material compositions to achieve the desired combination of properties, according to Malinov & Sha (2003). The problem presented here is to determine the fraction of each polymer blend component and to determine the compatibilizing agent or, in some cases, the system of agents when more than one compatibilizing agent is needed. Thus, this text studies the effect of factors such as the amount of polypropylene, the additive type and the amount of additive on the composition of polymer blends, i.e., the optimal polymer blend formulation, using factorial design.
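    The abstract does not include data or scripts, but the factorial-design approach it describes can be sketched in Python as below. The factor names (polypropylene fraction, additive type, additive amount), the coded two-level design and the response values are hypothetical assumptions for illustration only, not results from the study.

        # Illustrative 2^3 full factorial design for a polymer-blend formulation.
        # Factor names, levels and response values are hypothetical assumptions.
        import itertools
        import numpy as np

        # Coded factor levels (-1 = low, +1 = high) for:
        #   x1: polypropylene fraction, x2: additive type, x3: additive amount
        runs = np.array(list(itertools.product([-1, 1], repeat=3)), dtype=float)

        # Hypothetical measured response for each run (e.g., tensile strength, MPa)
        y = np.array([21.0, 24.5, 22.1, 27.8, 20.3, 25.1, 23.0, 29.4])

        # Model matrix with main effects and two-factor interactions
        x1, x2, x3 = runs[:, 0], runs[:, 1], runs[:, 2]
        X = np.column_stack([np.ones(8), x1, x2, x3, x1 * x2, x1 * x3, x2 * x3])

        # Least-squares fit; for a coded 2-level design each coefficient is half
        # the corresponding effect, so the largest coefficients point to the
        # factors that dominate the blend property
        coef, *_ = np.linalg.lstsq(X, y, rcond=None)
        for name, value in zip(["mean", "x1", "x2", "x3", "x1x2", "x1x3", "x2x3"], coef):
            print(f"{name:>5}: {value: .3f}")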

    Report on the International Congress on Health Sciences and Medical Technologies

    The International Congress on Health Sciences and Medical Technologies is a periodic congress. It has been held three times, bringing together researchers and organizations active in technology and medical research. With the contribution of universities, research centers, companies and publishers, the congress was established in the Algerian cities of Tlemcen and Algiers. This report describes the congress's achievements over the past three years and announces its new edition.

    Explainer: An interactive Agent for Explaining the Diagnosis of Cardiac Arrhythmia Generated by IK-DCBRC

    Interactions between medical applications and users require a high level of trust, since many complex, automated applications are integrated into critical domains in which public health is paramount. Because uncertainty reduces the accuracy of, and trust in, such medical applications, explanation-aware computing becomes crucial to improving their effectiveness. This paper describes an intelligent agent that interacts with users to provide meaningful explanations of diagnoses produced by IK-DCBRC. The agent ensures intelligent interaction with users via a rule-based system that generates appropriate explanations according to the selected level of abstraction and the detected cardiac arrhythmia. The paper also describes the underlying medical application, namely the automatic diagnosis of cardiac arrhythmia supported by the case-based reasoning classifier IK-DCBRC.
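    As a rough illustration of how a rule-based explanation agent of this kind might map a diagnosis and an abstraction level to an explanation, consider the Python sketch below. The rule table, abstraction levels and wording are invented for illustration and are not the actual IK-DCBRC rule base.

        # Hypothetical sketch of a rule-based explanation selector; the rules and
        # abstraction levels are illustrative, not the IK-DCBRC rule base.
        from dataclasses import dataclass

        @dataclass
        class Diagnosis:
            arrhythmia: str        # e.g. "atrial_fibrillation"
            similar_case_id: str   # nearest retrieved case used by the CBR classifier
            similarity: float      # similarity score in [0, 1]

        RULES = {
            ("atrial_fibrillation", "summary"):
                "The rhythm was classified as atrial fibrillation.",
            ("atrial_fibrillation", "detailed"):
                "Irregular R-R intervals and absent P waves matched case {case} "
                "with similarity {sim:.2f}, supporting atrial fibrillation.",
        }

        def explain(d: Diagnosis, level: str) -> str:
            """Return an explanation for the diagnosis at the requested abstraction level."""
            template = RULES.get((d.arrhythmia, level))
            if template is None:
                return f"No explanation rule for {d.arrhythmia} at level '{level}'."
            return template.format(case=d.similar_case_id, sim=d.similarity)

        print(explain(Diagnosis("atrial_fibrillation", "C-102", 0.91), "detailed"))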

    A Compact Sift-Based Strategy for Visual Information Retrieval in Large Image Databases

    This paper applies the Standard Scale Invariant Feature Transform (S-SIFT) algorithm to compute image descriptors of the eye region for a set of human eye images from the UBIRIS database, despite photometric transformations. The core assumption is that textured regions are locally planar and stationary. A descriptor with this type of invariance is sufficient to discern and describe a textured area regardless of the viewpoint and lighting in a perspective image, and it permits the identification of similar types of texture in a figure, such as the iris texture of an eye. It also makes it possible to establish correspondences between texture regions from distinct images acquired from different viewpoints (for example, two views of the front of a house), at different scales, and/or subjected to linear transformations such as translation. Experiments have confirmed that the S-SIFT algorithm is a potent tool for a variety of problems in image identification.
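    A minimal sketch of the descriptor-matching step described above is given below, using the SIFT implementation in OpenCV (version 4.4 or later is assumed, where SIFT is part of the main module). The file names are placeholders and the ratio-test threshold is a conventional choice, not a parameter reported by the paper.

        # Sketch of SIFT descriptor matching between two eye images.
        # File names are placeholders; OpenCV >= 4.4 is assumed.
        import cv2

        img1 = cv2.imread("eye_view_1.png", cv2.IMREAD_GRAYSCALE)
        img2 = cv2.imread("eye_view_2.png", cv2.IMREAD_GRAYSCALE)

        sift = cv2.SIFT_create()
        kp1, des1 = sift.detectAndCompute(img1, None)   # keypoints + 128-D descriptors
        kp2, des2 = sift.detectAndCompute(img2, None)

        # Match descriptors and keep only matches passing Lowe's ratio test,
        # which rejects ambiguous correspondences between the two textures
        matcher = cv2.BFMatcher(cv2.NORM_L2)
        matches = matcher.knnMatch(des1, des2, k=2)
        good = [m for m, n in matches if m.distance < 0.75 * n.distance]
        print(f"{len(good)} reliable correspondences between the two iris textures")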

    Proposal for Medical Data Transmission in Healthcare Systems

    Background: Information systems used in hospitals are slow and consume a lot of memory, which facilitates crashes and forces patients seeking consultation to face long waits for a medical specialist; the problem can become even more acute when patient data and medical consultations are exchanged across interconnected hospital systems for appointment scheduling. Methods: To address these problems, this study applies discrete-event modeling to a healthcare system, modulating the transmitted signal in the DQPSK format within the Simulink environment of MATLAB and improving data transmission through a bit pre-coding process that applies discrete events to the signal before modulation. Results: The approach increases the information capacity available to healthcare systems, bringing a new approach to signal transmission in the discrete domain by employing discrete entities in the bit-generation process, applied to the bit itself in the physical layer; it shows better computational performance in memory utilization related to information compression, with an improvement of 101.52%. Conclusion: The proposal can improve the capacity of hospital services and increase the performance of communication between medical devices, with the positive impact that the data stream consumes fewer communication resources.
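    To illustrate the DQPSK modulation format mentioned in the Methods, a minimal Python sketch follows. It encodes bit pairs as differential phase increments; it does not reproduce the authors' Simulink model or their discrete-event pre-coding stage, and the bit values are arbitrary examples.

        # Minimal DQPSK (differential QPSK) modulator sketch; illustrative only,
        # not the authors' Simulink implementation or pre-coding stage.
        import numpy as np

        def dqpsk_modulate(bits: np.ndarray) -> np.ndarray:
            """Map bit pairs to phase increments and accumulate phase differentially."""
            assert bits.size % 2 == 0, "DQPSK consumes two bits per symbol"
            pairs = bits.reshape(-1, 2)
            # Gray-coded phase increments for each bit pair
            increments = {(0, 0): 0.0, (0, 1): np.pi / 2,
                          (1, 1): np.pi, (1, 0): 3 * np.pi / 2}
            phase = 0.0
            symbols = []
            for b in pairs:
                phase += increments[tuple(b)]
                symbols.append(np.exp(1j * phase))   # unit-energy complex symbol
            return np.array(symbols)

        bits = np.array([0, 1, 1, 1, 1, 0, 0, 0])
        print(dqpsk_modulate(bits))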